Web Survey Bibliography
During the 1990s, cognitive interviewing in its various incarnations (e.g., concurrent think-aloud, retrospective think-aloud, focus group discussions, probes, and memory cues) became the primary means for evaluating questions (see Lessler and Forsyth, 1995; Conrad and Blair, 1996; Willis and Schechter, 1997; Tourangeau, Rips, and Rasinski, 2000). By examining the cognitive processes respondents went through while interacting with a survey question, survey methodologists uncovered how small manipulations in question wording influenced respondents’ answers. Another way researchers have historically uncovered the effects of question wording is through experimental field tests in which several versions of a question are randomly assigned to subsamples of respondents. Over the past decade, researchers have embedded the bulk of these experiments in Web surveys of undergraduate students because of the affordability of implementing experimental designs in this mode and the technological acuity of college students. Still, a researcher who wanted to test a single item with a large, heterogeneous audience had limited options. Google Consumer Surveys, a non-probability platform, may provide a solution. However, two questions remain to be answered. First, how do results from these non-probability surveys compare with those from proven cognitive interview techniques? Relatedly, what is the value added by conducting such experiments? In this paper, we answer these questions by comparing results from alternate forms of two questions that were tested at NORC at the University of Chicago with two waves of cognitive interviews and with Google Consumer Surveys (N = 4,000). The results suggest that the Google Consumer Surveys data complement the findings from the cognitive interviews and that the inferred, weighted demographic data are useful for certain types of studies.
Web Survey Bibliography - Stern, M. J. (13)
- An Examination of How Survey Mode Affects Eligibility, Response and Health Condition Reporting Rates...; 2016; Stern, M. J.; Ghandour, R.
- Mode and Eligibility Rates in a Dual-mode Web and Mail Survey; 2016; Ventura, I.; Bilgen, I.; Stern, M. J.
- Can Using a Mixed Mode Approach Improve the Representativeness and Data Quality in Panel Surveys?; 2016; Stern, M. J.
- Spatial Modeling through GIS to Reveal Error Potential in Survey Data: Where, What and How Much; 2016; English, N.; Ventura, I.; Bilgen, I.; Stern, M. J.
- Where Does the Platform Matter: The Impact of Geographic Clustering in Device Ownership and Internet...; 2015; Bilgen, I.; English, N.; Stern, M. J.; Ventura, I.
- Comparing Field and Laboratory Usability Tests to Assess the Consistency and Mistakes in Web Survey...; 2015; Croen, A.; Gonzales, N.; Ghandour, R.; Stern, M. J.
- Question Grouping and Matrices in Web Surveys: Using Response and Auxiliary Data to Examine Question...; 2014; Bilgen, I.; Stern, M. J.
- Can Google Consumer Surveys Help Pre-Test Alternative Versions of a Survey Question?: A Comparison of...; 2013; Stern, M. J.; Welch, W. W.
- How Do Different Sampling Techniques Perform in a Web-Only Survey? Results From a Comparison of a Random...; 2013; Bilgen, I.; Stern, M. J.; Wolter, K.
- Are Response Rates to a Web-Only Survey Spatially Clustered?; 2013; Fiorio, L.; Stern, M. J.; English, N.; Bilgen, I.; Curtis, B.
- How Representative are Google Consumer Surveys?: Results From an Analysis of a Google Consumer Survey...; 2013; Krishnamurty, P.; Tanenbaum, E.; Stern, M. J.
- The effects of item saliency and question design on measurement error in a self-administered survey; 2012; Stern, M. J.; Mendez, J. D.; Smyth, J. D.
- Comparing Check-All and Forced-Choice Question Formats in Web Surveys: The Role of Satisficing, Depth...; 2005; Smyth, J. D.; Dillman, D. A.; Christian, L. M.; Stern, M. J.